# Multi-programming language support

## Seed Coder 8B Reasoning Bf16
- License: MIT
- Seed-Coder is an 8B-scale open-source code model family with base, instruct, and reasoning variants. The reasoning variant strengthens reasoning ability through reinforcement learning and supports a 64K context length.
- Tags: Large Language Model, Transformers
- Publisher: ByteDance-Seed · Downloads: 4,382 · Likes: 9
## ModularStarEncoder
- License: OpenRAIL
- A 1-billion-parameter code encoder pre-trained on The Stack v2 dataset, featuring a modular design and a bidirectional self-attention mechanism.
- Tags: Large Language Model, Transformers
- Publisher: modularStarEncoder · Downloads: 147 · Likes: 1
## Granite 3B Code Instruct 128K
- License: Apache-2.0
- Granite-3B-Code-Instruct-128K is a long-context instruction model with 3 billion parameters, fine-tuned from Granite-3B-Code-Base-128K and specialized in coding-related tasks.
- Tags: Large Language Model, Transformers
- Publisher: ibm-granite · Downloads: 1,516 · Likes: 10
## CodeLlama 34B Instruct HF
- Code Llama is Meta's 34-billion-parameter instruction-tuned code generation model, designed for general code synthesis and understanding.
- Tags: Large Language Model, Transformers, Other
- Publisher: codellama · Downloads: 20.29k · Likes: 286
## CodeLlama 34B HF
- Code Llama is a family of large language models for code generation and understanding developed by Meta; the 34B version is the 34-billion-parameter base model.
- Tags: Large Language Model, Transformers, Other
- Publisher: codellama · Downloads: 11.90k · Likes: 169
## Code Trans T5 Large Code Documentation Generation Java Multitask
- A Java code-documentation generation model based on the T5-large architecture, trained in a multi-task setting and specialized in generating descriptions of Java functions.
- Tags: Large Language Model
- Publisher: SEBIS · Downloads: 13 · Likes: 0